Threshold Gradient Descent Regularization in the Additive Risk Model with High-Dimensional Covariates
Abstract
An additive risk model is a useful alternative to the Cox model (Cox, 1972) and may be adopted when the absolute effects, rather than the relative effects, of multiple predictors on the hazard function are of interest. In this article, we propose using the threshold gradient descent regularization (TGDR) method for covariate selection, estimation, and prediction in the additive risk model for right-censored survival data when the dimension of the covariate vector can exceed the sample size. Such "small n, large p" problems may arise in investigating the association between survival times and gene expression profiles. We propose using V-fold cross-validation and a modified Akaike's Information Criterion (AIC) for tuning parameter selection. The proposed method is demonstrated with two real data examples. The results show that TGDR is effective in model reduction and can identify more relevant covariates than the least absolute shrinkage and selection operator (lasso) approach.
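The core TGDR iteration can be illustrated outside the survival setting. The sketch below, a simplified illustration on squared-error loss rather than the additive risk model's estimating equations, applies the thresholded gradient update: at each step, only coordinates whose gradient magnitude is within a fraction `tau` of the largest gradient are moved. The function name and parameters (`tau`, `nu`, `n_steps`) are illustrative choices, not from the original paper; `tau` near 0 yields ridge-like dense updates, `tau` near 1 yields sparse, lasso-like paths.

```python
import numpy as np

def tgdr(X, y, tau=0.9, nu=0.05, n_steps=1000):
    """Threshold gradient descent regularization, sketched on squared loss.

    tau in [0, 1] controls sparsity: only coordinates whose gradient
    magnitude is at least tau * max|gradient| are updated each step.
    nu is the (small) step size; n_steps plays the role of the
    regularization/stopping tuning parameter selected by cross-validation.
    """
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_steps):
        # Negative gradient of 0.5 * mean squared error
        g = X.T @ (y - X @ beta) / n
        # Threshold: move only the coordinates with the largest gradients
        mask = np.abs(g) >= tau * np.max(np.abs(g))
        beta[mask] += nu * g[mask]
    return beta
```

In the paper's setting, the squared-error gradient would be replaced by the gradient derived from the additive risk model's estimating equations, with `tau` and the number of steps chosen by V-fold cross-validation or the modified AIC.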
Similar Articles
A Sparse-group Lasso
For high dimensional supervised learning problems, often using problem specific assumptions can lead to greater accuracy. For problems with grouped covariates, which are believed to have sparse effects both on a group and within group level, we introduce a regularized model for linear regression with ℓ1 and ℓ2 penalties. We discuss the sparsity and other regularization properties of the optimal...
Parameterless Optimal Approximate Message Passing
Iterative thresholding algorithms are well-suited for high-dimensional problems in sparse recovery and compressive sensing. The performance of this class of algorithms depends heavily on the tuning of certain threshold parameters. In particular, both the final reconstruction error and the convergence rate of the algorithm crucially rely on how the threshold parameter is set at each step of the ...
High-Dimensional Sparse Additive Hazards Regression
High-dimensional sparse modeling with censored survival data is of great practical importance, as exemplified by modern applications in high-throughput genomic data analysis and credit risk analysis. In this article, we propose a class of regularization methods for simultaneous variable selection and estimation in the additive hazards model, by combining the nonconcave penalized likelihood appr...
Nonparametric regression and classification with joint sparsity constraints
We propose new families of models and algorithms for high-dimensional nonparametric learning with joint sparsity constraints. Our approach is based on a regularization method that enforces common sparsity patterns across different function components in a nonparametric additive model. The algorithms employ a coordinate descent approach that is based on a functional soft-thresholding operator. T...
Threshold Gradient Descent Method for Censored Data Regression with Applications in Pharmacogenomics
An important area of research in pharmacogenomics is to relate high-dimensional genetic or genomic data to various clinical phenotypes of patients. Due to large variability in time to certain clinical event among patients, studying possibly censored survival phenotypes can be more informative than treating the phenotypes as categorical variables. In this paper, we develop a threshold gradient d...